
    A cautionary note on robust covariance plug-in methods

    Many multivariate statistical methods rely heavily on the sample covariance matrix. It is well known, though, that the sample covariance matrix is highly non-robust. One popular approach for "robustifying" a multivariate method is simply to replace the covariance matrix with some robust scatter matrix. The aim of this paper is to point out that in some situations certain properties of the covariance matrix are needed for the corresponding robust "plug-in" method to be a valid approach, and that not all scatter matrices necessarily possess these important properties. In particular, three multivariate methods are discussed in this paper: independent components analysis, observational regression and graphical modeling. For each case, it is shown that using a symmetrized robust scatter matrix in place of the covariance matrix results in a proper robust multivariate method.
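
    As a rough illustration of the plug-in idea and of what a "symmetrized" scatter matrix is, the base-R sketch below applies a scatter estimator to all pairwise differences of the observations. Tyler's shape matrix is used here purely as an illustrative robust scatter; this is a sketch of the general construction, not the paper's own estimator or recommendation.

```r
## Illustrative Tyler-type shape matrix via the usual fixed-point iteration.
tyler_shape <- function(Z, tol = 1e-6, maxit = 100) {
  p <- ncol(Z)
  V <- diag(p)
  for (it in seq_len(maxit)) {
    d2 <- rowSums((Z %*% solve(V)) * Z)       # z_i' V^{-1} z_i
    Vnew <- p * crossprod(Z / sqrt(d2)) / nrow(Z)
    Vnew <- Vnew / det(Vnew)^(1 / p)          # normalize to det = 1 (shape only)
    if (max(abs(Vnew - V)) < tol) break
    V <- Vnew
  }
  Vnew
}

## Symmetrization: compute the scatter of all pairwise differences,
## which needs no location estimate.
symmetrized_tyler <- function(X) {
  idx <- t(combn(nrow(X), 2))                 # all pairs i < j
  tyler_shape(X[idx[, 1], ] - X[idx[, 2], ])
}

set.seed(1)
X <- rbind(matrix(rnorm(150 * 3), ncol = 3),
           matrix(rnorm(10 * 3, mean = 8), ncol = 3))   # clean data plus outliers
V_sym <- symmetrized_tyler(X)                 # robust "plug-in" replacement for cov(X)
```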

    Multivariate L1 Statistical Methods: The Package MNM

    In this paper we present the R package MNM, dedicated to multivariate data analysis based on the L1 norm. The analysis proceeds very much as a traditional multivariate analysis does: the regular L2 norm is replaced by different L1 norms, observation vectors are replaced by their (standardized and centered) spatial signs, spatial ranks and spatial signed-ranks, and so on. The procedures are fairly efficient and robust, and no moment assumptions are needed for the asymptotic approximations. The background theory is briefly explained for the multivariate linear regression model, and the use of the package is illustrated with several examples.
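
    The spatial signs and spatial ranks mentioned above can be written down in a few lines of base R. The sketch below is a conceptual illustration only; it does not reproduce the MNM package's own (standardized and centered) scores or its inference machinery.

```r
## Spatial signs U(x) = x / ||x|| of the (centered) observations.
spatial_signs <- function(X, center = colMeans(X)) {
  Xc <- sweep(X, 2, center)
  r <- sqrt(rowSums(Xc^2))
  r[r == 0] <- 1                              # keep exact zero rows as zero signs
  Xc / r
}

## Spatial ranks: average of the signs of the differences to all other points.
spatial_ranks <- function(X) {
  n <- nrow(X)
  t(sapply(seq_len(n), function(i) {
    D <- sweep(X, 2, X[i, ], "-")[-i, , drop = FALSE]   # x_j - x_i, j != i
    colMeans(-D / sqrt(rowSums(D^2)))                   # average of U(x_i - x_j)
  }))
}
```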

    New Algorithms for MM-Estimation of Multivariate Scatter and Location

    We present new algorithms for MM-estimators of multivariate scatter and location and for symmetrized MM-estimators of multivariate scatter. The new algorithms are considerably faster than the currently used fixed-point and related algorithms. The main idea is to utilize a second-order Taylor expansion of the target functional and to devise a partial Newton-Raphson procedure. In connection with the symmetrized MM-estimators, we work with incomplete U-statistics to accelerate our procedures initially.
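
    The incomplete-U-statistic idea can be pictured as drawing a random subset of pairs instead of using all n(n-1)/2 pairwise differences. The base-R sketch below illustrates only that subsampling step, with the sample covariance as a placeholder scatter; it is not the paper's partial Newton-Raphson algorithm.

```r
## Incomplete pairwise-difference scatter: subsample pairs to cut the cost
## of working with all n(n-1)/2 differences. `scatter` is a placeholder here.
incomplete_pairwise_scatter <- function(X, n_pairs = 5L * nrow(X),
                                        scatter = cov) {
  n <- nrow(X)
  i <- sample.int(n, n_pairs, replace = TRUE)
  j <- sample.int(n, n_pairs, replace = TRUE)
  keep <- i != j                              # discard self-pairs
  scatter(X[i[keep], , drop = FALSE] - X[j[keep], , drop = FALSE])
}
```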

    Fourth Moments and Independent Component Analysis

    In independent component analysis it is assumed that the components of the observed random vector are linear combinations of latent independent random variables, and the aim is then to find an estimate of the transformation matrix back to these independent components. In the engineering literature there are several traditional estimation procedures based on the use of fourth moments, such as FOBI (fourth order blind identification), JADE (joint approximate diagonalization of eigenmatrices) and FastICA, but the statistical properties of these estimates are not well known. In this paper various independent component functionals based on fourth moments are discussed in detail, starting with the corresponding optimization problems, deriving the estimating equations and estimation algorithms, and finding the asymptotic statistical properties of the estimates. Comparisons of the asymptotic variances of the estimates in wide independent component models show that in most cases JADE and the symmetric version of FastICA perform better than their competitors. (Published in Statistical Science, http://dx.doi.org/10.1214/15-STS520.)
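
    Of the fourth-moment procedures named above, FOBI is the simplest to write down: whiten the data with the covariance matrix, then rotate with the eigenvectors of a fourth-moment matrix of the whitened data. The base-R sketch below follows that standard description; it is only an illustration, and packages such as JADE provide production implementations.

```r
## Minimal FOBI: covariance whitening followed by an eigendecomposition
## of a fourth-moment matrix of the whitened data.
fobi <- function(X) {
  Xc <- sweep(X, 2, colMeans(X))              # center
  e1 <- eigen(cov(Xc), symmetric = TRUE)
  W1 <- e1$vectors %*% diag(1 / sqrt(e1$values)) %*% t(e1$vectors)  # cov^{-1/2}
  Y  <- Xc %*% W1                             # whitened data
  r2 <- rowSums(Y^2)                          # squared norms ||y_i||^2
  S2 <- crossprod(Y * r2, Y) / nrow(Y)        # fourth-moment matrix
  e2 <- eigen(S2, symmetric = TRUE)
  W  <- t(e2$vectors) %*% W1                  # estimated unmixing matrix
  list(W = W, S = Y %*% e2$vectors)           # unmixing matrix and sources
}

## Toy example: mix three independent sources with distinct kurtoses.
set.seed(2)
S0 <- cbind(runif(500), rexp(500), rnorm(500))  # independent sources
A  <- matrix(rnorm(9), 3, 3)                    # mixing matrix
est <- fobi(S0 %*% t(A))                        # recovers sources up to sign, order and scale
```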

    Tools for Exploring Multivariate Data: The Package ICS

    Invariant coordinate selection (ICS) has recently been introduced as a method for exploring multivariate data. It includes as a special case a method for recovering the unmixing matrix in independent components analysis (ICA). It also serves as a basis for classes of multivariate nonparametric tests, and as a tool in cluster analysis or blind discrimination. The aim of this paper is to briefly explain the ICS method and to illustrate how various applications can be implemented using the R package ICS. Several examples are used to show how the ICS method and the ICS package can be used in analyzing a multivariate data set.
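
    The two-scatter transformation behind ICS can be sketched in base R: choose two scatter matrices S1 and S2, and find the transformation B with B S1 B' = I and B S2 B' diagonal. The pairing of the covariance matrix with a fourth-moment matrix below is just one illustrative choice; the ICS package offers many scatter combinations and handles the scaling of the invariant coordinates properly.

```r
## A covariance-weighted fourth-moment scatter matrix (illustrative choice of S2).
cov4_matrix <- function(X) {
  Xc <- sweep(X, 2, colMeans(X))
  e <- eigen(cov(Xc), symmetric = TRUE)
  Y <- Xc %*% e$vectors %*% diag(1 / sqrt(e$values)) %*% t(e$vectors)
  d2 <- rowSums(Y^2)                          # squared Mahalanobis distances
  crossprod(Xc * d2, Xc) / (nrow(Xc) * (ncol(Xc) + 2))
}

## Bare-bones two-scatter transformation: simultaneously "diagonalize" S1 and S2.
ics_sketch <- function(X, S1 = cov(X), S2 = cov4_matrix(X)) {
  e1 <- eigen(S1, symmetric = TRUE)
  S1_inv_sqrt <- e1$vectors %*% diag(1 / sqrt(e1$values)) %*% t(e1$vectors)
  e2 <- eigen(S1_inv_sqrt %*% S2 %*% S1_inv_sqrt, symmetric = TRUE)
  B <- t(e2$vectors) %*% S1_inv_sqrt          # ICS transformation matrix
  Z <- sweep(X, 2, colMeans(X)) %*% t(B)      # invariant coordinates
  list(B = B, Z = Z, gen_kurtosis = e2$values)
}
```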